Stochastic Receptive Fields in Deep Convolutional Networks
Authors
Abstract
Deep convolutional neural networks (ConvNets) have rapidly grown in popularity due to their powerful capability to represent and model high-level abstractions of complex data. However, ConvNets require an abundance of data to adequately train network parameters. To tackle this problem, we introduce the concept of stochastic receptive fields, where the receptive fields are stochastic...
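The abstract breaks off before the mechanism is described, so the following is only an illustrative sketch of one way a receptive field could be made stochastic: a convolution whose dilation, and hence spatial extent, is sampled at random on each training pass. The class name, the candidate dilations, and the PyTorch realization are assumptions for illustration, not the paper's method.

```python
# Illustrative only: a "stochastic receptive field" realized (hypothetically) by sampling
# the dilation of a convolution at random on every training forward pass.
import random
import torch
import torch.nn as nn
import torch.nn.functional as F

class StochasticRFConv2d(nn.Module):
    def __init__(self, in_ch, out_ch, kernel_size=3, dilations=(1, 2, 3)):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_ch, in_ch, kernel_size, kernel_size) * 0.05)
        self.bias = nn.Parameter(torch.zeros(out_ch))
        self.dilations = dilations  # candidate receptive-field extents

    def forward(self, x):
        # Sample a dilation during training; fall back to the middle candidate at eval time.
        d = random.choice(self.dilations) if self.training else self.dilations[len(self.dilations) // 2]
        pad = d  # keeps the spatial size unchanged for a 3x3 kernel
        return F.conv2d(x, self.weight, self.bias, padding=pad, dilation=d)

layer = StochasticRFConv2d(3, 8)
y = layer(torch.randn(1, 3, 32, 32))  # output shape: (1, 8, 32, 32)
```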
Similar articles

Selecting Receptive Fields in Deep Networks
Recent deep learning and unsupervised feature learning systems that learn from unlabeled data have achieved high performance in benchmarks by using extremely large architectures with many features (hidden units) at each layer. Unfortunately, for such large architectures the number of parameters can grow quadratically in the width of the network, thus necessitating hand-coded “local receptive fi...
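As a hedged sketch of the general idea of learning receptive fields from feature similarity rather than hand-coding them: group features whose responses are strongly correlated, and connect each next-layer unit only to one such group. The function name, the plain correlation measure, and the random seeding below are illustrative assumptions, not the exact published procedure.

```python
# Group strongly correlated features into sparse receptive fields for the next layer.
import numpy as np

def select_receptive_fields(responses, n_fields, field_size, seed=0):
    """responses: (n_samples, n_features) activations from the layer below.
    Returns a list of index arrays, one per receptive field."""
    rng = np.random.default_rng(seed)
    n_features = responses.shape[1]
    # Pairwise correlation between feature responses as the similarity measure.
    sim = np.corrcoef(responses, rowvar=False)
    fields = []
    for _ in range(n_fields):
        seed_feat = rng.integers(n_features)   # pick a random seed feature
        order = np.argsort(-sim[seed_feat])    # most-correlated features first
        fields.append(order[:field_size])      # connect only to these features
    return fields

X = np.random.randn(1000, 256)                 # fake lower-layer activations
rfs = select_receptive_fields(X, n_fields=32, field_size=16)
```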
Understanding the Effective Receptive Field in Deep Convolutional Neural Networks
We study characteristics of receptive fields of units in deep convolutional networks. The receptive field size is a crucial issue in many visual tasks, as the output must respond to large enough areas in the image to capture information about large objects. We introduce the notion of an effective receptive field, and show that it both has a Gaussian distribution and only occupies a fraction of ...
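A minimal sketch of how an effective receptive field can be measured empirically, assuming a toy stack of 3x3 convolutions: backpropagate a unit gradient from the central output position and inspect the gradient magnitude over the input plane; only a small central region contributes appreciably, far less than the theoretical receptive field.

```python
# Estimate the effective receptive field of a small ConvNet via input gradients.
import torch
import torch.nn as nn

net = nn.Sequential(*[nn.Conv2d(1, 1, kernel_size=3, padding=1, bias=False) for _ in range(10)])

x = torch.zeros(1, 1, 65, 65, requires_grad=True)
y = net(x)
center = y[0, 0, y.shape[2] // 2, y.shape[3] // 2]
center.backward()                       # d(central output) / d(input)

erf = x.grad[0, 0].abs()                # gradient magnitude over the input plane
# Fraction of the input that contributes non-negligibly to the central output unit.
print(erf.max(), (erf > 1e-3 * erf.max()).float().mean())
```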
What are the Receptive, Effective Receptive, and Projective Fields of Neurons in Convolutional Neural Networks?
In this work, we explain in detail how receptive fields, effective receptive fields, and projective fields of neurons in different layers, convolution or pooling, of a Convolutional Neural Network (CNN) are calculated. While our focus here is on CNNs, the same operations, but in the reverse order, can be used to calculate these quantities for deconvolutional neural networks. These are important...
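The standard forward recursion for these quantities is small enough to show directly. Assuming each convolution or pooling layer is summarized by its kernel size k and stride s, the receptive field r and the input-pixel spacing (jump) j of adjacent units evolve as r_out = r_in + (k - 1) * j_in and j_out = j_in * s; the layer list below is an arbitrary example, not taken from the paper.

```python
# Layer-by-layer receptive-field recursion for convolution/pooling layers.
def receptive_field(layers):
    r, j = 1, 1
    for k, s in layers:            # each layer described by (kernel_size, stride)
        r = r + (k - 1) * j        # grow the receptive field by (k-1) input-space steps
        j = j * s                  # striding widens the spacing of later units
    return r

# conv3x3/s1 -> pool2x2/s2 -> conv3x3/s1 -> pool2x2/s2 -> conv3x3/s1
print(receptive_field([(3, 1), (2, 2), (3, 1), (2, 2), (3, 1)]))  # -> 18
```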
Building Sparse Deep Feedforward Networks using Tree Receptive Fields
Sparse connectivity is an important factor behind the success of convolutional neural networks and recurrent neural networks. In this paper, we consider the problem of learning sparse connectivity for feedforward neural networks (FNNs). The key idea is that a unit should be connected to a small number of units at the next level below that are strongly correlated. We use Chow-Liu’s algorithm to ...
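As a hedged illustration of the Chow-Liu step mentioned above (the mutual-information estimator, the Prim-style tree construction, and the binary toy data are assumptions for illustration): estimate pairwise mutual information between units and keep only a maximum spanning tree of those weights as the sparse, tree-shaped connectivity.

```python
# Chow-Liu-style sketch: pairwise mutual information + maximum spanning tree.
import numpy as np

def mutual_info_binary(x, y, eps=1e-12):
    """MI between two binary vectors, estimated from empirical joint frequencies."""
    mi = 0.0
    for a in (0, 1):
        for b in (0, 1):
            pxy = np.mean((x == a) & (y == b)) + eps
            px, py = np.mean(x == a) + eps, np.mean(y == b) + eps
            mi += pxy * np.log(pxy / (px * py))
    return mi

def chow_liu_edges(data):
    """data: (n_samples, n_vars) of 0/1. Returns maximum-spanning-tree edges."""
    n_vars = data.shape[1]
    w = np.zeros((n_vars, n_vars))
    for i in range(n_vars):
        for j in range(i + 1, n_vars):
            w[i, j] = w[j, i] = mutual_info_binary(data[:, i], data[:, j])
    # Prim's algorithm for a maximum spanning tree over the MI weights.
    in_tree, edges = {0}, []
    while len(in_tree) < n_vars:
        i, j = max(((i, j) for i in in_tree for j in range(n_vars) if j not in in_tree),
                   key=lambda e: w[e])
        in_tree.add(j)
        edges.append((i, j))
    return edges

data = (np.random.rand(500, 8) > 0.5).astype(int)   # fake binary activations
print(chow_liu_edges(data))                          # 7 tree edges over 8 units
```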
Journal
Title: Vision Letters
Year: 2015
ISSN: 2369-6753
DOI: 10.15353/vsnl.v1i1.42